Empirical likelihood and uniform convergence rates for dyadic kernel density estimation


Abstract

This article studies the asymptotic properties of, and alternative inference methods for, kernel density estimation (KDE) with dyadic data. We first establish uniform convergence rates for the KDE. Second, we propose a modified jackknife empirical likelihood procedure for inference. The proposed test statistic is asymptotically pivotal regardless of the presence of dyadic clustering. The results are further extended to cover the practically relevant case of incomplete dyadic data. Simulations show that this modified jackknife empirical likelihood procedure delivers precise coverage probabilities even with modest sample sizes. Finally, we illustrate the method by studying airport congestion in the United States.
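A dyadic kernel density estimator of the kind described in the abstract averages a kernel over pairwise observations W_ij rather than over individual observations. The following is a minimal illustrative sketch, not the authors' exact estimator or inference procedure; it assumes a Gaussian kernel, a symmetric outcome array `W`, and hypothetical function names:

```python
import numpy as np

def dyadic_kde(W, x, h):
    """Kernel density estimate at grid points x from a symmetric dyadic
    array W of pairwise outcomes W_ij (diagonal ignored). Averages a
    Gaussian kernel over the n(n-1)/2 distinct pairs i < j."""
    n = W.shape[0]
    iu = np.triu_indices(n, k=1)                   # pairs with i < j
    w = W[iu]                                      # dyadic observations
    u = (x[:, None] - w[None, :]) / h              # scaled differences
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)   # Gaussian kernel
    return k.mean(axis=1) / h                      # average over pairs

# Example: simulated dyads W_ij driven by node-level effects X_i, X_j
rng = np.random.default_rng(0)
X = rng.normal(size=30)
W = 0.5 * (X[:, None] + X[None, :]) + rng.normal(scale=0.3, size=(30, 30))
grid = np.linspace(-3, 3, 61)
fhat = dyadic_kde(W, grid, h=0.4)
```

Because the n(n-1)/2 dyads sharing a common node are dependent, naive i.i.d. standard errors for `fhat` would be invalid; handling that clustering robustly is precisely what the article's jackknife empirical likelihood procedure addresses.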


Similar articles

Uniform Convergence Rates for Kernel Density Estimation

Kernel density estimation (KDE) is a popular nonparametric density estimation method. We (1) derive finite-sample high-probability density estimation bounds for multivariate KDE under mild density assumptions which hold uniformly in x ∈ R^d and bandwidth matrices. We apply these results to (2) mode, (3) density level set, and (4) class probability estimation and attain optimal rates up to logarit...


Maximum likelihood kernel density estimation

Methods for improving the basic kernel density estimator include variable locations, variable bandwidths (often called variable kernels) and variable weights. Currently these methods are implemented separately and via pilot estimation of variation functions derived from asymptotic considerations. In this paper, we propose a simple maximum likelihood procedure which allows (in its greatest gener...


Convergence Rates of Active Learning for Maximum Likelihood Estimation

An active learner is given a class of models, a large set of unlabeled examples, and the ability to interactively query labels of a subset of these examples; the goal of the learner is to learn a model in the class that fits the data well. Previous theoretical work has rigorously characterized label complexity of active learning, but most of this work has focused on the PAC or the agnostic PAC ...


Convergence Rates in Nonparametric Bayesian Density Estimation

We consider Bayesian density estimation for compactly supported densities using Bernstein mixtures of beta-densities equipped with a Dirichlet prior on the distribution function. We derive the rate of convergence for α-smooth densities for 0 < α ≤ 2 and show that a faster rate of convergence can be obtained by using fewer terms in the mixtures than proposed before. The Bayesian procedure adapts...


ARE for Testing; Convergence Rate of Kernel Density Estimation

The t-test is as defined in the previous lecture. It has slope 1/σ. The Mann-Whitney test rejects if (1/(nm)) Σ_{i,j} I(X_i ≤ Y_j) is large. Note: the Mann-Whitney statistic has a relationship with the area under the ROC curve (AUC) for classification algorithms with a tunable parameter. The ROC plot has one axis for the proportion of false positives and one axis for the proportion of true positives; as we mov...
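The equivalence between the Mann-Whitney statistic and the AUC can be checked numerically. A small sketch under the usual conventions (function names are hypothetical; ties, which have probability zero for continuous scores, would get half credit in the AUC):

```python
import numpy as np

def mann_whitney_stat(x, y):
    """Proportion of pairs with X_i <= Y_j: (1/(nm)) * sum_{i,j} I(X_i <= Y_j)."""
    return (x[:, None] <= y[None, :]).mean()

def auc_from_scores(neg, pos):
    """AUC for a scalar score: probability that a random positive scores
    above a random negative, with ties counted as half."""
    gt = (pos[None, :] > neg[:, None]).mean()
    eq = (pos[None, :] == neg[:, None]).mean()
    return gt + 0.5 * eq

rng = np.random.default_rng(1)
x = rng.normal(size=50)            # scores of the negative class
y = rng.normal(loc=1.0, size=60)   # scores of the positive class (shifted up)
u = mann_whitney_stat(x, y)        # coincides with auc_from_scores(x, y)
```

With continuous scores the two quantities agree exactly, which is why ranking a classifier's threshold sweep by AUC is the same as running a Mann-Whitney comparison of the two score distributions.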



Journal

Journal title: Journal of Business & Economic Statistics

Year: 2022

ISSN: 1537-2707, 0735-0015

DOI: https://doi.org/10.1080/07350015.2022.2080684